Patent abstract:

Publication number: SE0950302A1
Application number: SE0950302
Filing date: 2009-05-05
Publication date: 2010-11-06
Inventors: Ola Pettersson; Kay Sponseller; Erik Waestlund
Applicant:
IPC main classification:
Patent description:

…lowering the quality of life for the wheelchair user, who becomes highly dependent on help from another person and is thereby limited in his or her independent movement.
In addition, if the user has motor and/or cognitive impairments that affect the ability to operate a joystick unit or a blowing-and-suction system, control is made even more difficult. The user may quickly become tired, may not be able to enter a desired change of direction, or may have difficulty understanding the instructions that can be given to the control system. This makes efficient use of the wheelchair impossible and often requires the presence of an accompanying person, which in turn means that the opportunity to fend for oneself and decide on one's own movement is severely limited.
The problem that the use of a control unit is difficult or impossible for a user with motor impairments can be solved by using a wheelchair that is controlled by, for example, eye movements or voice commands. Such a technique is disclosed in WO 2005/119554 (Levy et al.), and although it can make things much easier for a user with only motor disabilities, it can be used only to a limited extent, or not at all, if the cognitive ability is impaired. There is also no way to adapt a wheelchair to a specific user's needs other than to specially order the wheelchair from the beginning, something that would be unreasonably expensive.
There is therefore a great need for a disability aid that can make everyday life easier for users with cognitive disabilities and increase the independence of wheelchair users.
BRIEF DESCRIPTION OF THE INVENTION It is an object of the present invention to eliminate or at least minimize the above-mentioned problems, which is achieved by a control system, said system being adapted to be connected to a movable handicap aid such as a wheelchair or an AKKA plate in such a way that information transfer can take place between said control and monitoring system and said handicap aid, wherein the system is also arranged to be adaptable to different control routines. In this way, a control system that is adapted to a user's special needs can be connected to an existing disability aid, so that a standard product can, through this addition, be particularly suited to a given person's cognitive and motor skills and facilitate movement and independent mobility for this person. According to one aspect of the invention, said system comprises a control unit for eye control. In this way, a user who has motor and/or cognitive difficulties that complicate the use of other control systems, such as joystick control, can control the handicap aid simply by fixing the gaze on given fields on a screen, and the system can, depending on the desired complexity, be made simpler or more complex through such eye control.
According to another aspect of the invention, said eye control means further comprises means for capturing images arranged to take at least one image of an environment around the handicap aid, suitable for projecting on the screen. This allows the user to see a view of the surroundings and has an easier time understanding the connection between a direction indicated in a symbol field and the actual direction in which a movement would take place.
According to a further aspect of the invention, a separate joystick unit and a switch are provided, whereby the control of the system can be switched between this joystick unit and the eye control unit. This makes it possible for a person other than the user, for example a companion, friend or relative, to take over control of the assistive device if necessary, either to prevent a potentially undesirable situation where the user may lose control of the device, or to relieve the user by taking over the steering during part or all of the journey. This can also function as a safety system that enables intervention from outside in the event of a dangerous or undesirable situation.
According to another aspect of the invention, the control system further comprises a navigation unit with at least one sensor for detecting objects in the environment and a memory unit which is arranged to identify the position of the aid based on the objects which the sensor detects. This makes it possible for the aid to act on the basis of a complex instruction so that a movement to a desired position is achieved, and so that the journey to this position proceeds without colliding with surrounding objects or getting lost so that the system loses knowledge of its position. This can further relieve the user, who after entering a desired destination can relax during the journey there, without having to instruct the system about each step along the way. The user can still intervene if the journey is to be interrupted or if a journey to another destination is desired, but if nothing else happens, the navigation unit itself continues to control the aid.
According to a further aspect of the invention, the memory unit is further arranged to remember a number of given positions and to calculate a suitable path from the present position of the handicap aid to one of the given positions.
In this way, the best route can be chosen, perhaps along a wall or across a floor surface, and the position of fixed obstacles such as furniture can be taken into account in the calculation of the route.
It is also conceivable to select a series of stations along the way before the final goal is reached, so that the aid moves along a path chosen by the user, which can be re-selected in any given situation. By knowing obstacles such as walls, furniture and other objects, the system can avoid collisions, but should an obstacle still get in the way, the system can easily brake or turn away by itself, thanks to the detection that takes place at the sensor and which can detect this type of obstacle.
According to another aspect of the invention, said navigation unit further comprises a position unit, connected to the memory unit, for determining the position of the control system in relation to the surroundings. In this way, a further level of complexity can be achieved, where the user can choose a destination freely based on a map image or an address, and the aid itself can navigate there along a road that is judged to be the most suitable. Thanks to the fact that an approximate position can be determined without having to relate to known objects in the immediate environment, good navigation can also be achieved outdoors in, for example, an urban environment, so that the user can stay in new environments and still handle navigation and control of the aid in a simple and practical way.
According to a further aspect of the invention, said navigation unit further comprises an input instrument connected to the memory unit, wherein said input instrument is arranged to receive an instruction on a given position stored in the memory unit. Hereby the user can indicate the desired destination, either via an input instrument with symbols that can be pressed (a so-called touchscreen), by joystick control or by eye control, for example. If it is suitable for the user, the eye control system described herein can be connected to the navigation device, so that both of these systems are integrated and controlled through the same screen where the user's point of gaze is detected.
BRIEF DESCRIPTION OF THE DRAWINGS The invention will be described in more detail below with reference to the accompanying drawings, in which: Fig. 1 shows a general view of a mobile handicap aid, Fig. 2 shows a schematic view of a control system for a mobile handicap aid, equipped with control and navigation units according to a preferred embodiment of the present invention, Fig. 3a shows a schematic view of a screen of a control unit according to the invention, and Fig. 3b shows a schematic view of an alternative design of the screen from Fig. 3a.
DETAILED DESCRIPTION OF FIGURES Fig. 1 shows an overview of a mobile handicap aid such as a wheelchair 1, comprising wheels 11, seat 12, footrest 13, armrests 14 and backrest 15 to support the user and facilitate placement in the chair. In the event that the user does not move and control the movement himself, for example by manually rolling the wheels, a control system 2 is placed on the aid and arranged to be able to drive the wheels 11. In conventional electric wheelchairs 1, the user can steer through access to a control unit, e.g. a joystick unit 3, which is positioned so that the user can easily reach it, for example in connection with the armrest 14, and is connected to the control system 2. When the user wants the wheelchair 1 to move in a particular direction, he or she can give the control system 2 instructions on this by moving a joystick of the joystick unit 3, and the signal then generated is interpreted by the control system 2 and results in a movement of the wheels 11 being performed.
In the event that a person other than the user himself, for example a friend, relative or companion, must be able to push and steer the wheelchair 1, handles 16 are available, for example in connection with the backrest 15, so that simple and manageable steering is possible.
Fig. 2 shows a schematic view of a control system 2 for a mobile handicap aid 1, equipped with control and navigation units 4, 5 according to a preferred embodiment of the present invention. There, the control system 2 is shown with connections 21, 22 where external units can be connected. In known control systems with joystick control, as described above, there is generally at least one connection 21 which is intended for connecting said joystick unit 3, and to the same connection 21 a control unit 4 for eye control can be connected instead of said joystick unit 3.
In the event that the user himself has difficulty handling a joystick, for example due to impaired mobility or impaired cognitive function, a system that responds to eye movements means a significant improvement and increased opportunities for this user to control his or her own progress and move independently. Said control unit 4 for eye control comprises a screen 41 where at least one, preferably at least two, more preferably at least four, symbols are displayed, which for example represent different available directions of movement of the wheelchair 1. The control unit 4 also comprises means 42 for detecting a gaze point of the user, i.e. the specific point at which the user is looking, and determining whether that gaze point is within the limits of a symbol on the screen 41 or not. Interpretation means 43, for example in the form of a software program of the control unit 4, converts this information into a signal which gives an instruction regarding the movement ordered by the user through the eye signal, and forwards this instruction to the control system 2. Thanks to the connection via the same connection 21 intended for a joystick unit 3, and thanks to the design of the control unit 4 itself, this can be connected to an existing control system without the need for significant modifications. The control system 2 generally perceives the newly connected control unit 4 as if it were the joystick unit 3, and can therefore respond as reliably and in as controlled a manner to eye-control signals as it usually does to joystick signals.
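The interpretation step described above can be sketched in code: the control unit checks whether a measured gaze point falls inside one of the symbol fields on the screen and, if so, emits the corresponding movement instruction. This is a minimal illustrative sketch; the `GazeField` class, field names and coordinates are assumptions and not taken from the patent text.

```python
from dataclasses import dataclass

@dataclass
class GazeField:
    """A rectangular symbol field on the eye-control screen (cf. 41a-41c)."""
    name: str      # movement instruction this field represents
    x: float       # left edge on the screen, in pixels
    y: float       # top edge
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        # True if the gaze point (px, py) lies within the field's limits
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.height)

def interpret_gaze(fields, px, py):
    """Map a gaze point to a movement instruction, or None when the user
    is looking outside every symbol field."""
    for field in fields:
        if field.contains(px, py):
            return field.name
    return None

# Three side-by-side fields, as in the Fig. 3a layout described below
fields = [
    GazeField("left",    0,   0, 100, 300),
    GazeField("forward", 100, 0, 100, 300),
    GazeField("right",   200, 0, 100, 300),
]
print(interpret_gaze(fields, 150, 120))  # gaze rests in the middle field
```

The signal returned here would then be converted by the interpretation means 43 into the joystick-like signal the control system 2 expects.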
To further simplify matters for a user with impaired cognitive ability, a means for capturing images, for example a camera 44, may be provided to the control unit 4 to record an image, in the form of a still image or moving images, of the area located in the intended direction(s) of movement of the wheelchair 1. This image can then interact with the symbol fields of the screen 41 in such a way, for example by overlay, that the user can fix his gaze on a direction on the screen and see there both a symbol denoting a certain direction and an image of the segment of the environment in which the wheelchair 1 would move if the gaze is fixed on the associated symbol. This makes it easier for the user by connecting the symbol and the actual direction, and to see where the journey is going, the user can follow this on the screen rather than having to look up at the surroundings. The image can also be manipulated to further assist the user, for example by increasing the contrast of the image, reducing the number of colors, enhancing lines or highlighting doors and other known objects in the environment. If an imaging device such as an IR camera or lidar is used, an image can be generated for the user even in low light conditions, so that the user does not have to find a light switch or otherwise generate light before an independent journey using the screen and the generated image can be undertaken.
The screen 41 can be placed somewhere adjacent to the wheelchair 1 where it is easy and convenient for the specific user to use the eye control function.
This may mean that the screen is placed in front or at the side, low or high, left or right, depending on the user's cognitive abilities and / or eyesight.
By means of a simple fastening device, such mounting can be facilitated. The design of the control unit 4 may differ depending on the needs of the user and the level of control that the control unit 4 may need to exercise over the movement of the wheelchair 1. Figs. 3a and 3b show two examples of preferred embodiments of the screen 41 and two different sets of symbol fields. In Fig. 3a there are three different symbol fields 41a, 41b, 41c which can be provided with symbols which indicate different intended directions of movement. When the user's gaze point is within the edges of any of these symbol fields, for example 41b, the system will interpret this as meaning that the user wishes to move in the direction indicated by the symbol of the field 41b. When the user then wants the movement to stop, this can be signalled by shifting the gaze point so that it falls outside the symbol field 41b, either by moving it away to an area 41d around the symbol fields that is outside the symbol-bearing areas, or away from the screen completely, or by shifting the gaze to one of the other symbol fields 41a, 41c, whereby the user commands movement in another direction.
Alternatively, if the user has a slightly higher cognitive ability, the system may interpret a gaze within a symbol field, for example 41a, as a desire for movement, but require the user to direct the gaze towards the same field 41a again, or towards another symbol field 41b, 41c, for the movement to cease. If the user looks away from the screen completely, or looks at an area 41d which does not in itself constitute an instruction to the control unit 4, this is not interpreted as an instruction that the movement should cease.
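The two alternative control routines just described can be sketched as a small state machine. In this illustrative sketch (the class and mode names are assumptions, not the patent's terminology), gaze samples arrive as the name of the gazed-at symbol field, or `None` when the gaze is off all fields. Mode "A" stops as soon as the gaze leaves the active field; mode "B" keeps moving until the user deliberately looks at a symbol field again.

```python
class EyeControlRoutine:
    """Interprets a stream of gaze samples under one of two routines."""

    def __init__(self, mode="A"):
        self.mode = mode      # "A" = gaze-hold routine, "B" = gaze-toggle routine
        self.command = None   # current movement command; None means stopped

    def update(self, gazed_field):
        """gazed_field: symbol-field name, or None if the gaze is elsewhere."""
        if self.mode == "A":
            # Movement only while the gaze rests on a symbol field;
            # looking away (None) stops immediately
            self.command = gazed_field
        else:  # mode "B"
            if gazed_field is not None:
                # Looking at the active field again stops the movement;
                # looking at another field redirects it
                self.command = (None if gazed_field == self.command
                                else gazed_field)
            # gazed_field is None: looking away is NOT a stop instruction,
            # so the user can interact with the surroundings mid-journey
        return self.command

a = EyeControlRoutine("A")
b = EyeControlRoutine("B")
a.update("forward"); b.update("forward")
print(a.update(None), b.update(None))  # A stops, B keeps going
```

Which routine is configured would, as the text notes next, depend on the user's cognitive and physical ability to hold a gaze direction.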
Which of these alternative control routines is chosen may depend on several things, including the user's ability to interact with the system both cognitively and physically, i.e. both the ability to understand how the system works and the physical ability to maintain a given gaze direction. For a more high-functioning user, there may also be a social advantage in not having to fixate on a given point to create a movement, but instead being able to interact with the surroundings during the journey and only focus on the screen when a change of movement is desired.
Fig. 3b shows an alternative screen configuration where practically the entire surface of the screen 41 consists of symbol fields 41a', 41b', 41c', 41d'. This can make it easier for the user to focus on the correct symbol field, since these can be made larger, and by placing the symbols in a logical way, for example in the form of arrows, the control unit 4 can be made more comprehensible to the user.
It is to be understood that the number of symbol fields of the screen 41 may vary depending on the specific user or the surrounding environment. The symbol fields can also be arranged to overlap or combine to some extent so that more complex directions can be specified, or changed during use so that the number of fields and their location change depending on the environment.
Thanks to the pictures that can be taken with the camera 44, the control unit 4 can also be made more user-friendly, so that the picture of the surroundings is displayed on the screen 41 together with suitable symbols of the symbol fields 41a, 41b, 41c or 41a', 41b', 41c', 41d'.
This allows the user to see where in the room a gaze at a particular field would direct the movement, without having to look up from the screen 41 to see where the movement is going.
A plurality of cameras 44 can be provided to the control unit 4 and film different segments of the environment. These can be displayed in different symbol fields or across the entire screen 41, based on what is best judged for the user's needs.
It is advantageous that staff who assist the user can also steer the wheelchair 1, perhaps at times when the user prefers to rest or when there is a risk of an accident due to the wheelchair 1's travel. To meet this need, a separate joystick unit 3' may be provided at the control unit 4, together with a switch 31' or the like, which allows the unit 4 to switch between inputs from the eye controls 41, 42, 43, 44 and inputs from the joystick 3'. In this way, a person in the vicinity of the wheelchair 1 can take over the control of the wheelchair 1 when needed. Both the joystick 3' and the switch 31' can be placed so that they are easily accessible to those around the user, for example in connection with the handles 16 or the backrest 15.
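The role of the switch 31' can be reduced to a simple selection between two input sources, which might look like the following sketch. The function and parameter names are illustrative assumptions; the point is only that the control system acts on exactly one source at a time, determined by the physical switch.

```python
def select_command(switch_on_joystick, joystick_command, eye_command):
    """Return the command the control system 2 should act on.

    switch_on_joystick -- state of the manual switch 31' (True = companion
                          has taken over with the separate joystick 3')
    joystick_command   -- latest command from the separate joystick 3'
    eye_command        -- latest command from the eye-control unit 4
    """
    return joystick_command if switch_on_joystick else eye_command

# Companion flips the switch and takes over, overriding the eye control
print(select_command(True, "stop", "forward"))
```

Because only one source is ever forwarded, a companion flipping the switch immediately overrides the eye control, which is what makes this usable as a safety mechanism.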
It is to be understood that the various components belonging to the control unit 4 may be separate or integrated in the same component, as is appropriate for the specific application.
To enable more complicated navigation, a navigation unit 5 can be connected to the control system 2, either in combination with the control unit 4 or on its own. Said navigation unit comprises at least one sensor 51 which is arranged to sense the surroundings around the aid 1 in order to detect walls or other obstacles or risks, such as stairs or thresholds, and thereby determine the position of the aid 1. A number of different types of sensors can be used, for example sonar, where upwards of 16 sensors evenly distributed around the aid may be needed to determine position, or lasers or cameras, where one sensor may be sufficient. The sensors used can measure the distances to obstacles in the environment and based on this create a memory image of what the current position looks like. The navigation unit 5 can then compare and match new sensor data with data already stored and in this way determine the position of the aid 1.
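The matching step just described can be sketched as follows. The flat distance-vector representation of a "memory image" and the position names are assumptions made for illustration; the patent does not specify how the stored sensor data is structured.

```python
def match_position(stored, reading):
    """Pick the stored position whose memory image best matches a new reading.

    stored  -- dict mapping position name -> list of sensor distances (metres)
    reading -- new list of sensor distances, same sensor order
    Returns the name of the best-matching stored position.
    """
    def mismatch(reference):
        # Summed absolute difference between stored and new distances
        return sum(abs(a - b) for a, b in zip(reference, reading))
    return min(stored, key=lambda name: mismatch(stored[name]))

stored = {"kitchen": [1.0, 2.0, 0.5], "hall": [3.0, 0.4, 2.2]}
print(match_position(stored, [1.1, 1.9, 0.6]))
```

A real system would use a more robust matching scheme, but the principle — compare fresh sensor data against stored memory images to localise the aid — is the one described above.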
By interpreting signals from this sensor 51, or from a number of sensors 51, the navigation unit 5 can generate stop signals to the control system 2 when an obstacle or a danger appears in the path of the wheelchair. It is also possible to allow the wheelchair 1 to follow a given path, such as a wall delimiting a room, and in this way achieve a controlled and predictable movement of the wheelchair 1. Such instructions, as well as others described in more detail below, can be specified by a user by handling an instrument 52 for input, for example a device which may be provided with buttons or a touchscreen, where the desired route can be set by pressing buttons or symbols. It is also possible to integrate the navigation unit 5 with the control unit 4 for eye control and to let the screen 41 used to indicate a direction also be used for these more complex instructions. The navigation unit 5 can in this case be connected to the control system 2 via the control unit 4 or directly to the connection 21, and said navigation unit 5 can also be used independently, connected through this connection 21 or through another connection option 22 provided at the control system 2.
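A minimal sketch of the stop-signal generation described above might look like this. The threshold values and the idea of scaling the required clearance with speed (anticipating the speed detection discussed further below) are illustrative assumptions.

```python
def stop_signal(sensor_distances_m, speed_m_s,
                base_margin_m=0.3, brake_factor_s=0.8):
    """True if the navigation unit should order the wheelchair to stop or brake.

    sensor_distances_m -- distances reported by the sensor(s) 51, in metres
    speed_m_s          -- current speed of the wheelchair, in metres/second
    The required clearance grows with speed, so an obstacle approached
    quickly triggers the stop earlier.
    """
    required_clearance = base_margin_m + brake_factor_s * speed_m_s
    return min(sensor_distances_m) < required_clearance

print(stop_signal([0.5, 2.0], speed_m_s=0.5))  # obstacle inside clearance
```

On a stop signal, the navigation unit could then, as described below, seek another path around the obstacle rather than simply halting.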
Alternatively, the navigation unit 5 can be combined only with the components 41, 42 of the control unit 4 needed to identify the point of view of a user, and these can be integrated with the unit 5 as a replacement for the input instrument 52.
The navigation unit 5 may also comprise a memory unit 53 which holds a programmed representation of the surroundings of the wheelchair 1, such as the room in which it was most recently located in the user's home, or the user's whole home. By having knowledge of how the space is laid out and which walls, stairs or other fixed obstacles exist, the memory unit 53 can calculate a suitable route to a given point. If the user wants to go from the kitchen to a bedroom, then thanks to an interaction between the memory unit 53 and the input instrument 52 (or the screen 41, if this is connected to the navigation unit 5), it may be enough to enter the bedroom as the destination, whereupon the wheelchair itself can find a suitable path and move to the desired location, either by following walls or by moving directly over a floor surface. Known obstacles, such as walls, can then be avoided because they are already known to the system. Unexpected obstacles, such as furniture that has been moved, closed doors, pets or persons, can be detected by means of the sensor 51, and the appearance of such an obstacle can trigger an immediate stop signal of the navigation unit 5 so that it stops or brakes and seeks another path around the obstacle. Fixed furniture can of course be programmed into the memory unit 53 so that it is known from the beginning.
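The route calculation described above can be sketched with the memory unit's map modelled as an occupancy grid and a breadth-first search finding a shortest path from the current position to the chosen destination. The grid representation and the BFS algorithm are assumptions for illustration; the patent does not specify how the map is stored or how the path is computed.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Shortest path on an occupancy grid (True = wall / fixed obstacle).

    Returns a list of (row, col) cells from start to goal inclusive,
    or None if the destination cannot be reached.
    """
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}   # also serves as the visited set
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []           # walk the chain of predecessors back to start
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and not grid[nr][nc] and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # e.g. the destination is walled off

# A wall (cf. fixed obstacles in the memory unit 53) forces a detour
grid = [[False, False, False],
        [True,  True,  False],
        [False, False, False]]
print(plan_path(grid, (0, 0), (2, 0)))
```

An unexpected obstacle detected by the sensor 51 could be handled by marking the blocked cell and re-running the search, which matches the "seeks another path around the obstacle" behaviour described above.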
The navigation unit 5 can also, for example by recognizing the objects or obstacles identified by means of the sensor or sensors 51 as described above, estimate the wheelchair 1's own position in relation to its surroundings, and this information can be used to calculate the best path between the current position and the desired goal. The speed at which the wheelchair 1 moves can also be detected in some suitable way, so that the need to turn and/or brake can be put in relation to how fast a detected obstacle approaches.
A particular route can be learned by driving the aid 1 along this route, for example by a companion driving the aid 1 along the desired path, and the system of the navigation unit 5 can then remember this road and store it in an internal map located at the navigation unit 5, for example in connection with or integrated with the memory unit 53. To the navigation unit 5 a satellite navigation unit 54, such as a GPS unit, a GLONASS unit or a unit connectable to another suitable system, can also be connected and used to determine the position of the wheelchair 1 in relation to the surroundings.
Thanks to such a unit 54, one can also navigate the wheelchair 1 outdoors. The user can then easily enter their desired destination, either by using simple symbols that represent pre-specified locations, such as the person's home and places he or she often needs to get to, or by entering an address or pointing to a map.
The navigation unit 5 can calculate the best route from the place where it is currently located to the specified destination, for example by matching the route to previously learned routes or by choosing a new route; information about, for example, where there are pedestrian crossings or which sidewalks are easiest to use can also be used to find the best route. Other factors can also be considered, and by adjusting the navigation unit 5, new factors can be entered and new criteria used for what is to be considered the best alternative. As described above, the system can learn by evaluating a route that has been travelled once, and thus over time remember what the user has previously indicated as suitable routes, desired goals or special needs, so that this can be reused the next time a similar request is stated. This satellite navigation unit 54 and the sensor 51 can, together or separately, be referred to as a position unit 51, 54 for determining the position of the wheelchair 1 in relation to the surroundings.
During all operations with a handicap aid 1 and control and navigation units 4, 5 according to the invention, the user should still have the opportunity to interrupt the journey or call for an emergency stop if this is necessary.
The use of a wheelchair 1 with control and / or navigation units 4, 5 according to the invention will now be briefly elucidated.
Based on the user's motor and cognitive abilities, as well as on the user's need for independent movement, the control and/or navigation units to be used are selected and adapted for the specific application. If the user has motor and/or cognitive problems that make it impossible to control with a joystick, for example, the eye control unit 4 is better suited, and if independent navigation indoors or in larger areas is desired without the user having to control the process in detail, the navigation unit 5 is suitable. Each such unit 4, 5 can then be modified so that the available functions are as closely adapted as possible to the user's abilities and needs, and then the units 4, 5 are mounted to the control system 2 of the wheelchair 1, for example through an accessible connection 21 which is often provided to enable control with a joystick unit 3. A control unit 3' can also be connected to the units 4, 5, for example with joystick control, which can be mounted on the wheelchair 1 in connection with the backrest 15 or the handles 16 so that it is easily accessible for an accompanying person. The control of the control system 2 can then be switched between the eye control unit 4 for the user and this second control unit 3' by manipulating a switch 31' which is accessible to the accompanying person. Thanks to the fact that the control unit 4 and the navigation unit 5 can be connected to existing handicap aids 1 of different makes, an easy adaptation to an individual user can be achieved each time the aid 1 is to be used by a new person.
The units 4, 5 are mounted in connection with the wheelchair 1, for example near the control system 2, by some suitable type of attachment. If a screen 41 for eye control is used, it is mounted where the user can conveniently fix his gaze on it, perhaps in connection with an armrest 14, and if an input instrument 52 with a touchscreen is used, it is mounted in a corresponding position. Thanks to the connection of the units 4, 5 to the control system 2, these can be supplied with power by the control system 2, but a separate power supply to the units 4, 5 is also conceivable. After assembly, the system is ready for use, so that the user can now take a seat in the wheelchair 1 and control the journey in a way that is adapted to his or her abilities. If another user is to use the same wheelchair 1, the system with the units 4, 5 can be modified to fit the new individual, or retained as before if this is more practical. If the system is self-learning, data for different users can be saved separately so that the system can easily switch from one user to another without this data being lost.
A variety of sensor types can be used with the invention to detect obstacles, determine position and determine speed, including sonar, lidar, radar or a camera unit, as well as tachometers connected to the wheels 11 or accelerometers.
The invention is not limited by what has been described above, but can be varied within the scope of the appended claims. It will be appreciated, for example, that the measured focal point need not be on a computer screen but may be elsewhere, or that other means than wheelchairs may be used with the invention. Several other modifications are also possible, as will be appreciated by those skilled in the art.
Claims:
Claims (1)
[1]
PATENT CLAIMS

1. Control and monitoring system, wherein said system is adapted to be connected to a movable handicap aid such as a wheelchair or an AKKA plate in such a way that information transfer can take place between said control and monitoring system and said handicap aid, characterized in that the system is also arranged to be adaptable to different control routines.

2. Control and monitoring system according to claim 1, characterized in that said system comprises a control unit (4) for eye control.

3. Control and monitoring system according to claim 2, characterized in that said control unit (4) for eye control comprises means (42) for detecting a gaze point of a user, at least one screen (41) provided with at least one symbol field (41a, 41b, 41c), and interpreting means (43) for interpreting a position of the measured gaze point on a symbol field (41a, 41b, 41c) as an instruction to the control system.

4. Control and monitoring system according to claim 2 or 3, characterized in that said eye control means further comprises means (44) for image capture arranged to take at least one picture of an environment around the disability aid (1), wherein said image can be projected on the screen (41).

5. Control and monitoring system according to any one of the preceding claims, characterized in that a separate joystick unit (3') is provided and that the control of the control system can be switched between the separate joystick unit (3') and the control unit (4) by means of a switch (31').

6. Control and monitoring system according to any one of the preceding claims, characterized in that said system further comprises a navigation unit (5).
7. Control and monitoring system according to claim 6, characterized in that said navigation unit (5) comprises at least one sensor (51) for detecting objects in an environment around the handicap aid and a memory unit (53) arranged to identify the handicap aid's (1) position based on the objects detected by said sensor (51).

8. Control and monitoring system according to claim 7, characterized in that said memory unit (53) is further arranged to remember a number of given positions and to calculate a suitable path from the current position of the disability aid (1) to one of the given positions.

9. Control and monitoring system according to claim 7 or 8, characterized in that said navigation unit (5) is further arranged to prevent collision with the objects detected by the sensor (51) by changing a direction of movement and/or speed of the handicap aid (1).

10. Control and monitoring system according to any one of claims 7-9, characterized in that said navigation unit (5) further comprises a position unit (51, 54), connected to the memory unit (53), for determining the position of the control system in relation to the environment.

11. Control and monitoring system according to any one of claims 7-10, characterized in that said navigation unit (5) further comprises an input instrument (52, 41, 42) connected to the memory unit (53), wherein said input instrument (52, 41, 42) is arranged to receive an instruction on a given position stored in the memory unit (53).

12. Control and monitoring system according to claim 11, characterized in that the navigation unit (5) is further arranged to independently steer a handicap aid (1) towards a given position, specified by a user with the aid of the input instrument (52).

13. Disability aid, such as a wheelchair or an AKKA plate, for moving at least one user, characterized in that said disability aid is equipped with at least one control and monitoring system according to any one of claims 1-12.
Similar technologies:
Publication number | Publication date | Patent title
SE0950302A1|2010-11-06|Control and control system for a mobile disability aid
JP2017024653A|2017-02-02|Occupant information acquisition device and vehicle control system
US7204328B2|2007-04-17|Power apparatus for wheelchairs
Leishman et al.2010|Smart wheelchair control through a deictic approach
US20150359691A1|2015-12-17|Movement assistance robot
JP5023328B2|2012-09-12|Meal support system
Aruna et al.2014|Voice recognition and touch screen control based wheel chair for paraplegic persons
Page et al.2017|Smart walkers: an application-oriented review
JP6137924B2|2017-05-31|Round-trip support robot and control program for round-trip support robot
Noman et al.2018|A New Design Approach for Gesture Controlled Smart Wheelchair Utilizing Microcontroller
KR101979846B1|2019-05-20|Intelligent line tracing moving system and walking robot
Dalsaniya et al.2016|Smart phone based wheelchair navigation and home automation for disabled
Moreira et al.2019|Smart and assistive walker–asbgo: rehabilitation robotics: a smart–walker to assist ataxic patients
JP5084756B2|2012-11-28|Autonomous mobile wheelchair
CN109334573B|2020-06-30|Automobile exterior rearview mirror adjusting method and device and automobile
JP6815891B2|2021-01-20|Walking support robot and walking support system
JP4904835B2|2012-03-28|Vehicle control device
TWI549668B|2016-09-21|Wheelchair control system and control method thereof
EP3575257A1|2019-12-04|Control of elevator with gaze tracking
JP6709989B2|2020-06-17|Electric wheelchair
JPH1099389A|1998-04-21|Walking training machine
KR20180024428A|2018-03-08|An apparatus for guiding for a handicapped person using color sensor
KR101860370B1|2018-05-24|a Public HMD apparatus and a game machine having the same
WO2016136127A1|2016-09-01|Display device, display method and program
Alibhai et al.2020|A Human-Computer Interface For Smart Wheelchair Control Using Forearm EMG Signals
Patent family:
Publication number | Publication date
WO2010128941A1|2010-11-11|
EP2427159B1|2014-03-26|
US20120046821A1|2012-02-23|
JP2012525914A|2012-10-25|
CA2760125A1|2010-11-11|
EP2427159A4|2013-05-08|
US8862307B2|2014-10-14|
SE533876C2|2011-02-15|
EP2427159A1|2012-03-14|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title

US3712716A|1971-04-09|1973-01-23|Stanford Research Inst|Eye tracker|
US5410376A|1994-02-04|1995-04-25|Pulse Medical Instruments|Eye tracking method and apparatus|
US6141034A|1995-12-15|2000-10-31|Immersive Media Co.|Immersive imaging method and apparatus|
US5802220A|1995-12-15|1998-09-01|Xerox Corporation|Apparatus and method for tracking facial motion through a sequence of images|
JPH1166320A|1997-08-12|1999-03-09|Mitsubishi Electric Corp|Eye image tracking device|
US6320610B1|1998-12-31|2001-11-20|Sensar, Inc.|Compact imaging device incorporating rotatably mounted cameras|
US6401050B1|1999-05-21|2002-06-04|The United States Of America As Represented By The Secretary Of The Navy|Non-command, visual interaction system for watchstations|
AUPQ896000A0|2000-07-24|2000-08-17|Seeing Machines Pty Ltd|Facial image processing system|
US6971471B2|2001-12-07|2005-12-06|General Motors Corporation|Multi-directional drive|
US6842692B2|2002-07-02|2005-01-11|The United States Of America As Represented By The Department Of Veterans Affairs|Computer-controlled power wheelchair navigation system|
US7046924B2|2002-11-25|2006-05-16|Eastman Kodak Company|Method and computer program product for determining an area of importance in an image using eye monitoring information|
US6637883B1|2003-01-23|2003-10-28|Vishwas V. Tengshe|Gaze tracking system and method|
US6842670B2|2003-05-02|2005-01-11|Chung Shan Institute Of Science And Technology|Eye-tracking driving system|
US7872635B2|2003-05-15|2011-01-18|Optimetrics, Inc.|Foveated display eye-tracking system and method|
US7949616B2|2004-06-01|2011-05-24|George Samuel Levy|Telepresence by human-assisted remote controlled devices and robots|
US7689010B2|2004-12-03|2010-03-30|Invacare International Sarl|Facial feature analysis system|
US20080228384A1|2007-03-17|2008-09-18|Erickson Clinton W|Navigational system for a personal mobility device|
NZ592381A|2011-04-20|2013-07-26|Dynamic Controls|Steering control system for a manual wheelchair for maintaining straight line path of travel|
KR20130065126A|2011-12-09|2013-06-19|한국전자통신연구원|Apparatus and method for generating path of mobile robot or grond vehicle|
US20140096642A1|2012-10-05|2014-04-10|Remy Technologies, Llc|Starter motor|
JP2014089513A|2012-10-29|2014-05-15|Denso Corp|Image generation apparatus and image generation program|
US9757054B2|2013-08-30|2017-09-12|Elwha Llc|Systems and methods for warning of a protruding body part of a wheelchair occupant|
US9488482B2|2013-08-30|2016-11-08|Elwha Llc|Systems and methods for adjusting a contour of a vehicle based on a protrusion|
WO2015073521A2|2013-11-18|2015-05-21|New York Institute Of Technology|Motorized walker|
TWI542339B|2013-12-12|2016-07-21|dun-ji Li|Wheelchair equipment and systems for self care|
US9383751B2|2013-12-12|2016-07-05|Medicraft HoldingsCo., Ltd.|Self operable wheelchair|
EP3122201A4|2014-03-24|2017-12-20|Ahmad Alsayed M. Alghazi|Multi-functional smart mobility aid devices and methods of use|
WO2015167411A1|2014-04-29|2015-11-05|Mutlu Lütfi|Smart navigation system for brainwave controlled wheelchairs|
JP6401679B2|2015-08-26|2018-10-10|日本電信電話株式会社|Collision avoidance system and avoidance device|
WO2017067990A1|2015-10-19|2017-04-27|Usadel Gmbh|Method for visually displaying the positions of selected destinations in maps in an altered manner|
WO2017147347A1|2016-02-23|2017-08-31|Deka Products Limited Partnership|Mobility device control system|
US10926756B2|2016-02-23|2021-02-23|Deka Products Limited Partnership|Mobility device|
US10908045B2|2016-02-23|2021-02-02|Deka Products Limited Partnership|Mobility device|
CA3024145A1|2016-04-14|2017-10-19|Deka Products Limited Partnership|User control device for a transporter|
TR201613586A2|2016-09-28|2016-11-21|Mehmet Doenmez|ELECTRONIC AND MOTORIZED WHEELCHAIR SYSTEM CONTROLLED BY EYE MOVEMENTS|
US10605614B2|2016-10-17|2020-03-31|International Business Machines Corporation|Generation of route network data for movement|
CN107212971B|2017-07-11|2018-09-21|薛红|A kind of folding intelligence stretcher|
CN107260412B|2017-07-11|2019-01-04|雷磊|Folding intelligence stretcher|
CN111542294A|2017-12-29|2020-08-14|四川金瑞麒智能科学技术有限公司|Intelligent wheelchair system|
US10660806B1|2020-01-15|2020-05-26|Blanche Michelle Nelson-Herron|Wheelchair safety systems and related methods|
Legal status:
2020-05-12| NUG| Patent has lapsed|
Priority:
Application number | Filing date | Patent title
SE0950302A|SE533876C2|2009-05-05|2009-05-05|Control and control system for a mobile disability aid|
CA2760125A| CA2760125A1|2009-05-05|2010-05-05|Steering and control system for a vehicle for the disabled|
EP10772342.1A| EP2427159B1|2009-05-05|2010-05-05|Steering and control system for a vehicle for the disabled|
PCT/SE2010/050495| WO2010128941A1|2009-05-05|2010-05-05|Steering and control system for a vehicle for the disabled|
JP2012509766A| JP2012525914A|2009-05-05|2010-05-05|Vehicle handling and control system for disabled people|
US13/266,022| US8862307B2|2009-05-05|2010-05-05|Steering and control system for a vehicle for the disabled|